Methods for Combining Heterogeneous Sets of Classifiers
Authors
Abstract
The combination of classifiers has long been proposed as a method to improve the accuracy achieved in isolation by a single classifier. In contrast to well-explored methods such as boosting and bagging, we are interested in ensemble methods that allow the combination of heterogeneous sets of classifiers, that is, classifiers built using differing learning paradigms. We focus on a theoretical and experimental comparison of five such combination methods: majority vote, a Bayesian method, a Dempster-Shafer method, behavior-knowledge space, and logistic regression. We have developed an upper bound on the accuracy that can be obtained by any of the five combination methods, and show that this estimate can be used to determine whether an ensemble may improve on the performance of its members. We have conducted a series of experiments using standard data sets and learning methods, and compared the experimental results to theoretical expectations.
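As a concrete illustration of what a heterogeneous ensemble combined by majority vote looks like, the sketch below trains a decision tree, a naive Bayes classifier, and a k-nearest-neighbor classifier and combines them with hard voting in scikit-learn. The dataset, member choices, and hyperparameters are illustrative assumptions, not the paper's experimental setup or its other combination rules.

```python
# Minimal sketch: majority-vote combination of classifiers built with
# differing learning paradigms (illustrative only, not the paper's code).
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.ensemble import VotingClassifier

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Three members from different learning paradigms.
members = [
    ("tree", DecisionTreeClassifier(random_state=0)),
    ("nb", GaussianNB()),
    ("knn", KNeighborsClassifier(n_neighbors=5)),
]

# voting="hard" implements the majority-vote combination rule.
ensemble = VotingClassifier(estimators=members, voting="hard")
ensemble.fit(X_train, y_train)
print("ensemble accuracy:", ensemble.score(X_test, y_test))

# Each member in isolation, for comparison with the combined accuracy.
for name, clf in members:
    clf.fit(X_train, y_train)
    print(name, "accuracy:", clf.score(X_test, y_test))
```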
Related works
Hybrid Intrusion Detection Using Ensemble of Classification Methods
One of the major developments in machine learning in the past decade is the ensemble method, which finds a highly accurate classifier by combining many moderately accurate component classifiers. In this research work, new ensemble classification methods are proposed for homogeneous ensemble classifiers using bagging and heterogeneous ensemble classifiers using arcing classifier and their performa...
Evaluation of Ensemble Classifiers for Handwriting Recognition
One of the major developments in machine learning in the past decade is the ensemble method, which finds a highly accurate classifier by combining many moderately accurate component classifiers. In this research work, new ensemble classification methods are proposed for homogeneous ensemble classifiers using bagging and heterogeneous ensemble classifiers using arcing classifier and their performa...
Adaptive boosting techniques in heterogeneous and spatial databases
Combining multiple classifiers is an effective technique for improving classification accuracy by reducing the variance through manipulating the training data distributions. In many large-scale data analysis problems involving heterogeneous databases with attribute instability, however, standard boosting methods do not improve local classifiers (e.g. k-nearest neighbors) due to their low sensit...
Combining Heterogeneous Sets of Classifiers: Theoretical and Experimental Comparison of Methods
In recent years, the combination of classifiers has been proposed as a method to improve the accuracy achieved in isolation by a single classifier. We are interested in ensemble methods that allow the combination of heterogeneous sets of classifiers, which are classifiers built using differing learning paradigms. We focus on theoretical and experimental comparison of five such combination metho...
Using Negative Correlation Learning to Improve the Performance of Neural Network Ensembles
This paper investigates the effect of diversity caused by Negative Correlation Learning (NCL) in the combination of neural classifiers and presents an efficient way to improve combining performance. Decision Templates and Averaging, as two non-trainable combining methods, and Stacked Generalization, as a trainable combiner, are investigated in our experiments. Utilizing NCL for diversifying the ba...
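As a hedged illustration of "Averaging" as a non-trainable combiner (a sketch, not code from the cited paper), the snippet below averages the class-probability estimates of several base classifiers and predicts the class with the highest mean probability; the probability values are made-up placeholders.

```python
# Minimal sketch of the Averaging combiner over base-classifier outputs.
import numpy as np

def average_combiner(prob_list):
    """prob_list: list of (n_samples, n_classes) probability arrays,
    one array per base classifier. Returns the combined class labels."""
    mean_probs = np.mean(np.stack(prob_list, axis=0), axis=0)
    return np.argmax(mean_probs, axis=1)

# Hypothetical outputs of three base classifiers on two samples.
p1 = np.array([[0.7, 0.3], [0.4, 0.6]])
p2 = np.array([[0.6, 0.4], [0.2, 0.8]])
p3 = np.array([[0.5, 0.5], [0.3, 0.7]])
print(average_combiner([p1, p2, p3]))  # -> [0 1]
```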
Journal:
Volume / Issue:
Pages: -
Publication date: 2000